Making use of Amazon Kinesis Firehose to pump Data to S3

2022.04.26

This article was published more than a year ago. Please note that the information may be out of date.

Amazon Kinesis Firehose

Amazon Kinesis Firehose is an AWS service that reliably loads streaming data directly into AWS products for processing. It is fully managed: scaling is handled automatically, up to gigabytes per second, and data can be batched, compressed, and encrypted before delivery. Supported destinations include S3, Amazon Elasticsearch Service, and Redshift, from which the data can be copied for further processing by additional services.
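To show how data actually enters a delivery stream, here is a minimal boto3 sketch that puts a single record into Firehose. The stream name and region are placeholders for whatever you create in the demo below, not values from this article.

```python
import json
import boto3

# Placeholder region and stream name: adjust to your own setup.
firehose = boto3.client("firehose", region_name="ap-northeast-1")

record = {"ticker": "DEMO", "price": 123.45}

# Firehose buffers incoming records and delivers them to the
# configured destination (S3 in this article) in batches.
response = firehose.put_record(
    DeliveryStreamName="my-demo-delivery-stream",  # hypothetical name
    Record={"Data": (json.dumps(record) + "\n").encode("utf-8")},
)

print(response["RecordId"])
```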

S3

Amazon S3 is a simple and popular AWS storage service. It replicates data across multiple facilities by default, charges per usage, and is deeply integrated with other AWS services. Buckets are the logical storage units, and objects are the data added to a bucket. Storage classes are set at the object level, which can save money by moving less frequently accessed objects to a colder storage class.
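To make buckets, objects, and per-object storage classes concrete, the sketch below uploads one object with a colder storage class. The bucket name, key, and region are assumptions; the bucket must already exist and bucket names are globally unique.

```python
import boto3

s3 = boto3.client("s3", region_name="ap-northeast-1")  # placeholder region

BUCKET = "my-firehose-demo-bucket"  # hypothetical bucket name

# An object is just data stored under a key inside the bucket.
s3.put_object(
    Bucket=BUCKET,
    Key="archive/2022/04/26/sample.json",
    Body=b'{"message": "rarely accessed data"}',
    # The storage class is chosen per object; STANDARD_IA costs less
    # for data that is read infrequently.
    StorageClass="STANDARD_IA",
)
```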

Demo

First, create an S3 bucket, give it a name, and keep the other settings as default. Then go to Kinesis and click Delivery streams. Choose Direct PUT as the source, so data can be sent straight to the endpoint, and set the destination to S3. Browse and select the S3 bucket created above, keep the other settings as default, and click Create delivery stream. Now test the delivery stream by sending demo data from Kinesis Firehose to S3, and keep it running for some time until the data reaches S3 (a scripted equivalent of this test is sketched below). Scrolling down, you can see how much data was pumped and the requests per second. Finally, go to S3, check the data, and download it to verify the contents.
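The console's demo-data test can also be reproduced from a script. The sketch below sends a small batch of records to the delivery stream and, after waiting for the buffering interval, lists what Firehose delivered to the bucket. The region, stream name, and bucket name are the same placeholders used above.

```python
import json
import time
import boto3

REGION = "ap-northeast-1"            # placeholder region
STREAM = "my-demo-delivery-stream"   # hypothetical delivery stream name
BUCKET = "my-firehose-demo-bucket"   # hypothetical S3 bucket name

firehose = boto3.client("firehose", region_name=REGION)
s3 = boto3.client("s3", region_name=REGION)

# Send a small batch of demo records, similar to the console test data.
records = [
    {"Data": (json.dumps({"ticker": "DEMO", "price": i}) + "\n").encode("utf-8")}
    for i in range(10)
]
firehose.put_record_batch(DeliveryStreamName=STREAM, Records=records)

# Firehose flushes to S3 once the buffer size or interval is reached
# (the default buffering hints are around 5 MB or 300 seconds), so wait.
time.sleep(300)

# List the delivered objects; Firehose writes them under a
# year/month/day/hour prefix by default.
listing = s3.list_objects_v2(Bucket=BUCKET)
for obj in listing.get("Contents", []):
    print(obj["Key"], obj["Size"])
```

Downloading one of the listed keys (for example with `s3.download_file`) and opening it locally corresponds to the final verification step in the console walkthrough.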

